Figure 8.31 shows the ACFs for 36 random numbers, 360 random numbers and 1,000 random numbers.
Left: ACF for a white noise series of 36 numbers. Middle: ACF for a white noise series of 360 numbers. Right: ACF for a white noise series of 1,000 numbers.
The longer the time series, the closer the critical values for autocorrelation get to zero: the range of values that shows no significant difference from zero (between the blue dashed lines) gets successively narrower as the number of observations increases from plot to plot.
All three plots look like white noise: almost none of the spikes extend beyond the critical values, and there is no discernible pattern in any of the plots.
From Hyndman:
For a white noise series, we expect 95% of the spikes in the ACF to lie within \(\pm 2/\sqrt{T}\) where \(T\) is the length of the time series.1
Since the critical values depend on \(T\), the longer the time series, the smaller the critical values will be in absolute value.
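The \(\pm 2/\sqrt{T}\) rule can be checked directly in base R for the three series lengths used above (variable names here are illustrative):

```r
# Critical values for the ACF of a white noise series: +/- 2/sqrt(T)
T_lengths <- c(36, 360, 1000)
crit <- 2 / sqrt(T_lengths)
round(crit, 3)
# T = 36 -> 0.333, T = 360 -> 0.105, T = 1000 -> 0.063
```

The shrinking values match the successively narrower blue bands in the three ACF plots.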
A classic example of a non-stationary series is the daily closing IBM stock price series (data set ibmclose). Use R to plot the daily closing prices for IBM stock and the ACF and PACF. Explain how each plot shows that the series is non-stationary and should be differenced.
A downward trend is clearly evident in the time plot, which rules out stationarity.
The ACF plot shows spikes at all lags lying well outside the critical values, decaying only very slowly, which indicates that the series is not white noise and therefore not stationary.
Because the first lag of a PACF plot equals the first lag of the ACF plot, the PACF also shows a large spike (near 1) at lag 1, which indicates that the series is not stationary.
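The slowly decaying ACF that flags non-stationarity can be reproduced with a simulated random walk, used here as a stand-in since the ibmclose data is not loaded in this sketch:

```r
set.seed(42)
rw <- cumsum(rnorm(1000))            # random walk: the classic non-stationary series
a <- acf(rw, lag.max = 20, plot = FALSE)
a$acf[2]                             # lag-1 autocorrelation is close to 1
# early lags stay near 1 instead of dying out quickly, just as with ibmclose
```

Differencing a random walk recovers the white noise increments, which is exactly why differencing is the remedy suggested here.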
For the following series, find an appropriate Box-Cox transformation and order of differencing in order to obtain stationary data.
usnetelec
usgdp
mcopper
enplanements
visitors

In homework 2 we examined Box-Cox transformations for the first four of these series and discovered that "The enplanements series is the only one of the four that shows a clear seasonality that increases with the increase in the level of the series, so it is the only one of the four series for which a Box-Cox transformation is warranted and useful."2 So we will only use a Box-Cox transformation on the enplanements time series, and possibly the visitors time series if it seems warranted after examining plots for increasing seasonal variation.
usnetelec

Description: Annual US net electricity generation (billion kWh) for 1949-2003
## [1] 1
We still see a lot of variation after differencing only once, and one negative spike extends below the critical value at lag 14, so let's see whether a second difference is better.
Not surprisingly, since the ndiffs function suggested only first-order differencing, a second difference made things even worse. Let's see whether a log or Box-Cox transformation helps in this case, even though a Box-Cox transformation did not seem warranted from the plots.
## [1] 2
# after log transformation and Differencing twice
usnetelec %>% log() %>% diff() %>% diff() %>% ggtsdisplay()

The ndiffs function suggested second-order differencing after a log transformation, but that made things worse again, so let's try the Box-Cox transformation…
## [1] 2
# after Box-Cox transformation and Differencing once
usnetelec %>% BoxCox(lambda = "auto") %>% diff() %>% ggtsdisplay()

Interesting! Even though a Box-Cox transformation did not seem warranted based on the plots, after some trial and error the best way to make the usnetelec series stationary seems to be a Box-Cox transformation with first-order differencing, even though the `ndiffs` function suggested second-order differencing after Box-Cox. However, it is only marginally better than simple first-order differencing, so for simplicity's sake we may choose to go with the original first-order differencing instead.
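For reference, the transformation that BoxCox() applies can be sketched in base R (a simplified version of the formula; the forecast package's implementation also handles lambda selection and back-transformation):

```r
# Box-Cox: w = (y^lambda - 1) / lambda for lambda != 0, log(y) for lambda == 0
box_cox <- function(y, lambda) {
  if (lambda == 0) log(y) else (y^lambda - 1) / lambda
}
box_cox(c(1, 2, 4), lambda = 1)   # lambda = 1 just shifts the data: 0 1 3
box_cox(exp(1), lambda = 0)       # lambda = 0 is the log transformation: 1
```

Values of lambda between 0 and 1 interpolate between a log and no transformation, which is why it can stabilize variance more flexibly than log() alone.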
usgdp

Description: Quarterly US GDP. 1947:1 - 2006:1.
## [1] 2
After my experience with the usnetelec series, I tried a log transformation and a Box-Cox transformation too, just in case, but second-order differencing alone produced the best outcome.
mcopper

Description: Monthly copper prices. Copper, grade A, electrolytic wire bars/cathodes, LME, cash (pounds/ton)
Source: UNCTAD http://stats.unctad.org/Handbook.
## [1] 1
Once again I tried a log transformation and a Box-Cox transformation too, just in case, but again first-order differencing alone produced the best outcome.
enplanements

Description: Domestic Revenue Enplanements (millions): 1996-2000.
Source: Department of Transportation, Bureau of Transportation Statistics, Air Carrier Traffic Statistic Monthly.
Here we can see a Box-Cox transformation is definitely warranted because we can see seasonal variability that increases with the increase in level.
## [1] 1
## [1] 1
The nsdiffs function recommended seasonal differencing after the Box-Cox transformation, and the ndiffs function then recommended further first-order differencing, which results in the following series…
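As a base-R sanity check on what diff(lag = 12) does: a seasonal difference subtracts the value from the same month one year earlier, so an exact period-12 pattern vanishes (toy data here, not the enplanements series):

```r
# a monthly pattern repeated over five "years"
x <- ts(rep(sin(2 * pi * (1:12) / 12), 5), frequency = 12)
sd12 <- diff(x, lag = 12)   # seasonal difference at lag 12
range(sd12)                 # exactly 0 everywhere: the seasonal pattern is gone
```

On real data the seasonal pattern is not exact, so a further first difference is often still needed to remove the remaining trend, as ndiffs suggests here.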
# after Box-Cox transformation, seasonal differencing, and first differencing
enplanements %>% BoxCox(lambda = "auto") %>% diff(lag=12) %>% diff() %>% ggtsdisplay()

cbind("enplanements" = enplanements,
"BoxCox\nTransformed" = BoxCox(enplanements, lambda = "auto"),
"Seasonally\ndifferenced" =
diff(BoxCox(enplanements, lambda = "auto"),12),
"Doubly\n differenced" =
diff(diff(BoxCox(enplanements, lambda = "auto"),12),1)) %>%
autoplot(facets=TRUE) +
xlab("Year") + ylab("") +
ggtitle("Domestic Revenue Enplanements (millions)")

visitors

Description: Monthly Australian short-term overseas visitors. May 1985-April 2005
Source: Hyndman, R.J., Koehler, A.B., Ord, J.K., and Snyder, R.D., (2008) Forecasting with exponential smoothing: the state space approach, Springer.
Again a Box-Cox transformation definitely seems to be warranted in this case due to seasonal variability that increases with the level of the series.
## [1] 1
## [1] 1
Once again, the nsdiffs function recommended seasonal differencing after the Box-Cox transformation, and the ndiffs function then recommended further first-order differencing, which results in the following series…
# after Box-Cox transformation, seasonal differencing, and first differencing
visitors %>% BoxCox(lambda = "auto") %>% diff(lag=12) %>% diff() %>% ggtsdisplay()

cbind("visitors" = visitors,
"BoxCox\nTransformed" = BoxCox(visitors, lambda = "auto"),
"Seasonally\ndifferenced" =
diff(BoxCox(visitors, lambda = "auto"),12),
"Doubly\n differenced" =
diff(diff(BoxCox(visitors, lambda = "auto"),12),1)) %>%
autoplot(facets=TRUE) +
xlab("Year") + ylab("") +
ggtitle("Monthly Australian short-term overseas visitors")

For your retail data (from Exercise 3 in Section 2.10), find the appropriate order of differencing (after transformation if necessary) to obtain stationary data.
retaildata <- readxl::read_excel("retail.xlsx", skip=1)
retail <- ts(retaildata[,"A3349335T"],
frequency=12, start=c(1982,4))
autoplot(retail)

A Box-Cox transformation definitely seems to be warranted due to seasonal variability that increases with the level of the series.
## [1] 1
## [1] 1
As with the last two time series in Exercise 8.3, the nsdiffs function recommended seasonal differencing after the Box-Cox transformation, and the ndiffs function then recommended further first-order differencing, which results in the following series…
# after Box-Cox transformation, seasonal differencing, and first differencing
retail %>% BoxCox(lambda = "auto") %>% diff(lag=12) %>% diff() %>% ggtsdisplay()

There's still a lot of autocorrelation in the ACF plot, but after trying a second-order seasonal difference and a second-order difference after the seasonal differencing, the plot above is still the best.
cbind("Retail Sales" = retail,
"BoxCox\nTransformed" = BoxCox(retail, lambda = "auto"),
"Seasonally\ndifferenced" =
diff(BoxCox(retail, lambda = "auto"),12),
"Doubly\n differenced" =
diff(diff(BoxCox(retail, lambda = "auto"),12),1)) %>%
autoplot(facets=TRUE) +
xlab("Year") + ylab("") +
ggtitle("Retail Sales")

Use R to simulate and plot some data from simple ARIMA models.
# y and e must be initialized before AR1 is called; the textbook exercise sets
y <- ts(numeric(100))
e <- rnorm(100)

AR1 <- function(phi, y, e){
for(i in 2:100)
y[i] <- phi*y[i-1] + e[i]
return(y)
}
phi_0.6 <- AR1(0.6, y, e)
autoplot(phi_0.6)

# par(mfrow=c(3,2))
phi_0.01 <- AR1(0.01, y, e) %>% autoplot() + ggtitle('phi_0.01')
phi_0.05 <- AR1(0.05, y, e) %>% autoplot() + ggtitle('phi_0.05')
phi_0.1 <- AR1(0.1, y, e) %>% autoplot() + ggtitle('phi_0.1')
phi_0.5 <- AR1(0.5, y, e) %>% autoplot() + ggtitle('phi_0.5')
phi_0.9 <- AR1(0.9, y, e) %>% autoplot() + ggtitle('phi_0.9')
phi_1.0 <- AR1(1.0, y, e) %>% autoplot() + ggtitle('phi_1.0')
grid.arrange(phi_0.01, phi_0.05, phi_0.1, phi_0.5, phi_0.9, phi_1.0,
nrow = 3)

# par(mfrow=c(3,2))
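The effect of φ seen in these panels can be isolated by switching off the noise: with a unit impulse and no errors, an AR(1) decays geometrically at rate φ (a small sketch using the same recursion as AR1 above; the helper name is mine):

```r
# impulse response of AR(1): y[1] = 1, e = 0, so y[i] = phi^(i-1)
ar1_impulse <- function(phi, n = 10) {
  y <- numeric(n)
  y[1] <- 1
  for (i in 2:n) y[i] <- phi * y[i - 1]
  y
}
ar1_impulse(0.6)[1:4]   # 1, 0.6, 0.36, 0.216 -- shocks die out quickly
ar1_impulse(1.0)[1:4]   # 1, 1, 1, 1 -- shocks never die out: random walk behaviour
```

This is why the simulated series get smoother and more trend-like as φ approaches 1, and non-stationary at φ = 1.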
# MA(1) simulation: y_t = e_t + theta * e_{t-1}
MA1 <- function(theta, y, e){
for(i in 2:100)
y[i] <- e[i] + theta*e[i-1]
return(y)
}
theta_0.01 <- MA1(0.01, y, e) %>% autoplot() + ggtitle('theta_0.01')
theta_0.05 <- MA1(0.05, y, e) %>% autoplot() + ggtitle('theta_0.05')
theta_0.1 <- MA1(0.1, y, e) %>% autoplot() + ggtitle('theta_0.1')
theta_0.5 <- MA1(0.5, y, e) %>% autoplot() + ggtitle('theta_0.5')
theta_0.9 <- MA1(0.9, y, e) %>% autoplot() + ggtitle('theta_0.9')
theta_1.0 <- MA1(1.0, y, e) %>% autoplot() + ggtitle('theta_1.0')
grid.arrange(theta_0.01, theta_0.05, theta_0.1,
theta_0.5, theta_0.9, theta_1.0,
nrow = 3)

ARIMA11 <- function(phi, theta, y, e){
for(i in 2:100)
y[i] <- phi*y[i-1] + e[i] + theta*e[i-1]
return(y)
}
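The ARIMA11 recursion combines an AR part (φ) and an MA part (θ). A numerical check on the MA side: an MA(1) has theoretical lag-1 autocorrelation θ/(1 + θ²), and a long simulation lands close to it (a sketch with simulated data, not part of the original answer):

```r
set.seed(1)
n <- 1e5
e <- rnorm(n)
y <- e[-1] + 0.6 * e[-n]            # MA(1) with theta = 0.6
rho1_theory <- 0.6 / (1 + 0.6^2)    # = 0.4411765
rho1_sample <- acf(y, plot = FALSE)$acf[2]
c(rho1_theory, rho1_sample)         # the two values should nearly agree
```

Unlike the AR(1), the MA(1) ACF cuts off sharply after lag 1, which is one way to tell the two apart in the simulated plots.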
phi_0.6_theta_0.6 <- ARIMA11(0.6, 0.6, y, e)

AR2 <- function(phi_1, phi_2, y, e){
for(i in 3:100)
y[i] <- phi_1*y[i-1] + phi_2*y[i-2] + e[i]
return(y)
}
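Before simulating this AR(2), it is worth checking stationarity: an AR process is stationary only if all roots of 1 − φ₁z − φ₂z² lie outside the unit circle. For φ₁ = −0.8, φ₂ = 0.3 one root falls inside, so the simulated series should oscillate and explode (a base-R check; the helper name is mine):

```r
# roots of the AR characteristic polynomial 1 - phi1*z - phi2*z^2 - ...
ar_roots <- function(phi) polyroot(c(1, -phi))
Mod(ar_roots(c(-0.8, 0.3)))   # one modulus is below 1 -> non-stationary
Mod(ar_roots(0.6))            # 1/0.6 = 1.667 > 1 -> the AR(1) above is stationary
```

The negative φ₁ makes successive values alternate in sign, so the explosion shows up as oscillations of growing amplitude.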
phi1_neg0.8_phi2_0.3 <- AR2(-0.8, 0.3, y, e)

Consider wmurders, the number of women murdered each year (per 100,000 standard population) in the United States.
By studying appropriate graphs of the series in R, find an appropriate \(\text{ARIMA}(p,d,q)\) model for these data.
Should you include a constant in the model? Explain.
Write this model in terms of the backshift operator.
Fit the model using R and examine the residuals. Is the model satisfactory?
Forecast three times ahead. Check your forecasts by hand to make sure that you know how they have been calculated.
Create a plot of the series with forecasts and prediction intervals for the next three periods shown.
Does auto.arima() give the same model you have chosen? If not, which model do you think is better?
Hyndman, R.J., & Athanasopoulos, G. (2018) Forecasting: principles and practice, 2nd edition, OTexts: Melbourne, Australia. OTexts.com/fpp2. Accessed on February 23, 2020.